
[tests] make cuda-only cases in TestModelAndLayerStatus device-agnostic #2026

Merged (3 commits) on Aug 21, 2024

Conversation

faaany (Contributor) commented on Aug 21, 2024

After the fix, all tests in TestModelAndLayerStatus pass on XPU:

================================================ short test summary info ================================================
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_layer_names_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_layer_names_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_module_type_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_module_type_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_enabled_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_enabled_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_enabled_irregular
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_active_adapters_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_active_adapters_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_merge_adapters_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_merge_adapters_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_requires_grad_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_requires_grad_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_requires_grad_irregular
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_available_adapters_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_available_adapters_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_devices_all_cpu_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_devices_all_cpu_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_devices_all_cuda_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_devices_cpu_and_cuda_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_base_model_type_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_base_model_type_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_base_model_type_transformers_automodel
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_adapter_model_type_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_adapter_model_type_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_peft_types_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_peft_types_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_nb_params_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_nb_params_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_num_adapter_layers_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_num_adapter_layers_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_enabled_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_enabled_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_disabled_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_disabled_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_enabled_irregular
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_active_adapters_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_active_adapters_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_active_adapters_irregular
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_merged_adapters_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_merged_adapters_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_merged_adapters_irregular
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_requires_grad_model_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_requires_grad_model_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_requires_grad_model_irregular
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_available_adapters_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_available_adapters_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_devices_all_cpu_small
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_devices_all_cpu_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_devices_all_cuda_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_devices_cpu_and_cuda_large
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_loha_model
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_vera_model
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_transformers_model
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_model_with_injected_layers
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_vanilla_model_raises
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_transformer_model_without_adapter_raises
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_prefix_tuning
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_adaption_prompt
PASSED tests/test_tuners_utils.py::TestModelAndLayerStatus::test_mixed_model_raises
============================================ 60 passed, 7 warnings in 10.21s ============================================
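
The fix boils down to inferring the device at runtime instead of hard-coding "cuda" in the test setup. A minimal sketch of the pattern (the helper is spelled out here for illustration; it is not the exact code from this PR):

import torch

def infer_device() -> str:
    # Pick the first available hardware accelerator, falling back to CPU.
    if torch.cuda.is_available():
        return "cuda"
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return "xpu"
    if hasattr(torch.backends, "mps") and torch.backends.mps.is_available():
        return "mps"
    return "cpu"

torch_device = infer_device()
# Tests then use .to(torch_device) and device=torch_device instead of "cuda",
# so the same cases run on CUDA, XPU, or any other supported accelerator.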

faaany changed the title from "[tests] enable cuda-only cases on XPU in TestModelAndLayerStatus" to "[tests] make cuda-only cases in TestModelAndLayerStatus device-agnostic" on Aug 21, 2024
faaany (Contributor, Author) commented on Aug 21, 2024

@BenjaminBossan, the require_non_cpu function comes from the accelerate library. I am not sure whether importing it directly from accelerate would be better than duplicating it in testing_utils.py. I could also use require_hardware_accelerator instead of require_non_cpu. What are your thoughts?
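
For reference, require_non_cpu is a skip decorator along these lines (an approximation for illustration, not accelerate's actual implementation):

import unittest
import torch

def require_non_cpu(test_case):
    # Skip the test unless some hardware accelerator (CUDA, XPU, MPS, ...) is available.
    has_accelerator = (
        torch.cuda.is_available()
        or (hasattr(torch, "xpu") and torch.xpu.is_available())
        or (hasattr(torch.backends, "mps") and torch.backends.mps.is_available())
    )
    return unittest.skipUnless(has_accelerator, "test requires a hardware accelerator")(test_case)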

faaany (Contributor, Author) commented on Aug 21, 2024

I also plan to make the tests in test_common_gpu.py and test_gpu_examples.py device-agnostic, because none of them require CUDA-specific kernels, and they can be extended to other GPUs such as Intel GPUs.
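
The conversion there would follow the same pattern, e.g. (a hypothetical pytest-style test; the name and logic are illustrative, not taken from those files):

import pytest
import torch

# Resolve the accelerator once at import time.
if torch.cuda.is_available():
    torch_device = "cuda"
elif hasattr(torch, "xpu") and torch.xpu.is_available():
    torch_device = "xpu"
else:
    torch_device = "cpu"

@pytest.mark.skipif(torch_device == "cpu", reason="requires a hardware accelerator")
def test_tensor_on_accelerator():
    # Formerly: torch.ones(2, 2, device="cuda")
    x = torch.ones(2, 2, device=torch_device)
    assert x.device.type == torch_device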


BenjaminBossan (Member) left a comment

Thanks a lot for making the test device-agnostic; the solution looks good.

BenjaminBossan merged commit 6c832c1 into huggingface:main on Aug 21, 2024. 14 checks passed.